Gradient Descent and Radial Basis Functions
Authors
Abstract
In this paper, we present experiments comparing different training algorithms for Radial Basis Function (RBF) neural networks. In particular, we compare the classical training scheme, which consists of unsupervised training of the centers followed by supervised training of the output weights, with the fully supervised training by gradient descent proposed recently in some papers. We conclude that fully supervised training generally performs better. We also compare batch training with online training and conclude that online training reduces the number of iterations.
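As a minimal sketch of the fully supervised scheme the abstract describes, the following trains all RBF parameters (centers, widths, and output weights) by online gradient descent on a squared-error loss. The Gaussian units, the toy sine-regression target, the network size, and the learning rate are all illustrative assumptions, not details taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression target (assumed for illustration): y = sin(x) on [-3, 3]
X = rng.uniform(-3.0, 3.0, size=(200, 1))
y = np.sin(X[:, 0])

n_centers = 10
c = rng.uniform(-3.0, 3.0, size=(n_centers, 1))  # centers
s = np.ones(n_centers)                           # widths (sigmas)
w = rng.normal(0.0, 0.1, size=n_centers)         # output weights
lr = 0.05                                        # learning rate (assumed)

def activations(Xb):
    # Gaussian units: phi_j(x) = exp(-||x - c_j||^2 / (2 s_j^2))
    d2 = ((Xb[:, None, :] - c[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * s ** 2))

def mse():
    return float(np.mean((activations(X) @ w - y) ** 2))

loss_before = mse()
for epoch in range(50):              # online training: one update per pattern
    for x, t in zip(X, y):
        d2 = ((x - c) ** 2).sum(axis=1)
        phi = np.exp(-d2 / (2.0 * s ** 2))
        err = phi @ w - t
        g = err * w * phi
        # Gradients of 0.5 * err^2 with respect to weights, centers, widths
        w -= lr * err * phi
        c -= lr * (g / s ** 2)[:, None] * (x - c)
        s -= lr * g * d2 / s ** 3
        s = np.maximum(s, 0.1)       # keep widths away from zero
loss_after = mse()
```

Because every parameter receives a gradient update, the centers can migrate to wherever they reduce the output error, rather than staying where an unsupervised clustering step placed them.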
Similar resources
On the construction and training of reformulated radial basis function neural networks
Presents a systematic approach for constructing reformulated radial basis function (RBF) neural networks, which was developed to facilitate their training by supervised learning algorithms based on gradient descent. This approach reduces the construction of radial basis function models to the selection of admissible generator functions. The selection of generator functions relies on the concept...
Approximation of Function by Adaptively Growing Radial Basis Function Neural Networks
In this paper a neural network for function approximation is described. The activation functions of the hidden nodes are Radial Basis Functions (RBF) whose parameters are learnt by a two-stage gradient descent strategy. A new growing node-insertion strategy with different radial basis functions is used in order to improve the network's performance. The learning strategy is...
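A hypothetical sketch of a growing strategy in the spirit of this abstract (not the paper's exact insertion rule): start with a single Gaussian unit and repeatedly insert a new unit at the training point with the largest residual error, re-solving the output weights by least squares after each insertion. The target function, shared width, and growth budget are assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.uniform(-3.0, 3.0, size=(150, 1))
y = np.sin(X[:, 0])          # toy target (assumed)

centers = [X[0]]             # start from one unit
sigma = 1.0                  # shared Gaussian width (assumed)

def design(Xb, C):
    # Design matrix of Gaussian activations, shape (N, n_units)
    C = np.asarray(C)
    d2 = ((Xb[:, None, :] - C[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * sigma ** 2))

errors = []
for _ in range(8):           # grow up to 8 units
    Phi = design(X, centers)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)   # refit output weights
    resid = Phi @ w - y
    errors.append(float(np.mean(resid ** 2)))
    # Insert a new unit where the current network is worst
    centers.append(X[np.argmax(np.abs(resid))])
```

Each insertion targets the region of largest error, so the approximation error typically drops quickly in the first few growth steps.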
Reformulated radial basis neural networks trained by gradient descent
This paper presents an axiomatic approach for constructing radial basis function (RBF) neural networks. This approach results in a broad variety of admissible RBF models, including those employing Gaussian RBF's. The form of the RBF's is determined by a generator function. New RBF models can be developed according to the proposed approach by selecting generator functions other than exponential ...
Global Optimization of RBF Networks
Several modifications to parameter estimation in a Radial Basis Function network are introduced. These include a better initial clustering algorithm and a full gradient descent on centers and weights after the weights have been found via matrix inversion. A performance comparison with other RBF algorithms is given on several data sets. The proposed method was found superior to Bis...
Publication date: 2006